Neural Network Learning Through Optimally Conditioned Quadratically Convergent Methods Requiring NO LINE SEARCH

Author

  • Homayoon S.M. Beigi
Abstract

Neural network learning algorithms based on conjugate gradient and quasi-Newton techniques, such as the Broyden, DFP, BFGS, and SSVM algorithms, require exact or inexact line searches to satisfy their convergence criteria. Line searches are very costly and slow down the learning process. This paper presents new neural network learning algorithms based on Hoshino's weak line search technique and on Davidon's optimally conditioned, line-search-free technique. A practical method of applying these optimization algorithms is also presented that allows them, in most cases, to avoid becoming trapped in local minima; the global minimization problem is a serious one when quadratically convergent techniques such as quasi-Newton methods are used. To demonstrate the performance of the proposed learning algorithms, the more practical algorithm, based on Davidon's minimization technique, is applied to a cursive handwriting recognition problem. For comparison with other algorithms, a few small benchmark tests are also conducted and reported.
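
To make the line-search cost concrete, below is a minimal sketch of a generic quasi-Newton (BFGS) iteration with an Armijo backtracking line search. Each outer step may spend several extra function evaluations inside the line search, which is precisely the overhead that line-search-free methods such as Davidon's are designed to eliminate. This is an illustrative sketch under generic assumptions, not the algorithm of the paper; all function and parameter names are our own.

```python
import numpy as np

def armijo_backtrack(f, x, p, g, alpha=1.0, rho=0.5, c=1e-4):
    """Inexact (Armijo) line search: the costly inner loop that
    line-search-free methods aim to remove."""
    while f(x + alpha * p) > f(x) + c * alpha * g.dot(p):
        alpha *= rho
    return alpha

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    n = x0.size
    H = np.eye(n)                    # inverse-Hessian approximation
    x, g = x0.astype(float), grad(x0)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                   # quasi-Newton search direction
        alpha = armijo_backtrack(f, x, p, g)  # several f-evaluations per step
        s = alpha * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s.dot(y)
        if sy > 1e-12:               # curvature condition keeps H positive definite
            r = 1.0 / sy
            I = np.eye(n)
            H = (I - r * np.outer(s, y)) @ H @ (I - r * np.outer(y, s)) \
                + r * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(bfgs(f, grad, np.array([-1.2, 1.0])))
```

Davidon's optimally conditioned update instead chooses the step and the update so that no such inner loop is required; the details are given in the paper.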

Similar Articles

A Quadratically Convergent Interior-Point Algorithm for the P*(κ)-Matrix Horizontal Linear Complementarity Problem

In this paper, we present a new path-following interior-point algorithm for P*(κ)-horizontal linear complementarity problems (HLCPs). The algorithm uses only full-Newton steps, which has the advantage that no line searches are needed. Moreover, the algorithm with the small-update method attains the currently best known iteration bound, which is as good as that of the linear analogue.

Chapter 15: Improved Learning of Neural Nets through Global Search

Learning in artificial neural networks is usually based on local minimization methods, which have no mechanism for escaping the influence of an undesired local minimum. This chapter presents strategies for developing globally convergent modifications of local search methods and investigates the use of popular global search methods in neural network learning. The proposed methods te...
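
As a minimal, generic illustration of the global-search idea (not the specific strategies of this chapter), the sketch below wraps an off-the-shelf local minimizer in random restarts and keeps the best result; the function names and test problem are our own.

```python
import numpy as np
from scipy.optimize import minimize

def multi_start(f, dim, n_starts=20, low=-5.0, high=5.0, seed=0):
    """Restart a local minimizer from random points and keep the best
    result, reducing the chance of returning a poor local minimum."""
    rng = np.random.default_rng(seed)
    best_x, best_f = None, np.inf
    for _ in range(n_starts):
        x0 = rng.uniform(low, high, size=dim)
        res = minimize(f, x0, method="BFGS")   # local, gradient-based search
        if res.fun < best_f:
            best_x, best_f = res.x, res.fun
    return best_x, best_f

# Usage: the Rastrigin function has many local minima
rastrigin = lambda x: 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
print(multi_start(rastrigin, dim=2))
```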

Armijo Newton method for convex best interpolation

Newton's method was proposed more than a decade ago for constructing the convex best interpolant. Its local quadratic convergence has only recently been established, by recasting it as the generalized Newton method for semismooth equations. It remains mysterious why the Newton method coupled with line search strategies works well in a practical, global sense. Similar to the classica...

A quadratically convergent algorithm for finding the largest eigenvalue of a nonnegative homogeneous polynomial map

In this paper we propose a quadratically convergent algorithm for finding the largest eigenvalue of a nonnegative homogeneous polynomial map, where the Newton method is used to solve an equivalent system of nonlinear equations. The semi-symmetric tensor is introduced to reveal the relation between a homogeneous polynomial map and its associated semi-symmetric tensor. Based on this relation a globally ...

A Differential Evolution and Spatial Distribution based Local Search for Training Fuzzy Wavelet Neural Network

Many parameter-tuning algorithms have been proposed for training Fuzzy Wavelet Neural Networks (FWNNs). The absence of an appropriate structure, convergence to local optima, and low speed are deficiencies of the learning algorithms in previous FWNN studies. In this paper, a Memetic Algorithm (MA) is introduced to train FWNNs and address the aforementioned learning deficiencies. Differential Evolution...
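
For context, below is a minimal sketch of classic differential evolution (the DE/rand/1/bin variant) as a standalone global optimizer. It is a textbook form, not the memetic algorithm proposed in the paper, and all names are our own.

```python
import numpy as np

def de_rand_1_bin(f, bounds, pop_size=30, F=0.8, CR=0.9,
                  generations=200, seed=0):
    """Classic DE: mutate with scaled difference vectors, apply binomial
    crossover, then keep the trial vector only if it is no worse."""
    rng = np.random.default_rng(seed)
    low, high = bounds[:, 0], bounds[:, 1]
    dim = bounds.shape[0]
    pop = rng.uniform(low, high, size=(pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), low, high)   # mutation
            cross = rng.random(dim) < CR                   # binomial crossover mask
            cross[rng.integers(dim)] = True                # take at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = f(trial)
            if f_trial <= fit[i]:                          # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]

# Usage: minimize the sphere function over [-5, 5]^3
bounds = np.array([[-5.0, 5.0]] * 3)
print(de_rand_1_bin(lambda x: float(np.sum(x**2)), bounds))
```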


Publication date: 2007